
    Multidimensional Binary Vector Assignment problem: standard, structural and above guarantee parameterizations

    In this article we focus on the parameterized complexity of the Multidimensional Binary Vector Assignment problem (called mBVA). An input of this problem is defined by $m$ disjoint sets $V^1, V^2, \dots, V^m$, each composed of $n$ binary vectors of size $p$. An output is a set of $n$ disjoint $m$-tuples of vectors, where each $m$-tuple is obtained by picking one vector from each set $V^i$. To each $m$-tuple we associate a $p$-dimensional vector by applying the bit-wise AND operation on the $m$ vectors of the tuple. The objective is to minimize the total number of zeros in these $n$ vectors. mBVA can be seen as a variant of multidimensional matching where hyperedges are implicitly locally encoded via labels attached to vertices, but it was originally introduced in the context of integrated circuit manufacturing. We provide for this problem FPT algorithms and negative results (ETH-based results, W[2]-hardness and a kernel lower bound) according to several parameters: the standard parameter $k$ (i.e. the total number of zeros), as well as two parameters above some guaranteed values. Comment: 16 pages, 6 figures
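    As a concrete illustration of the objective function, the following minimal sketch computes the cost (the total number of zeros after the bit-wise AND) of one candidate assignment. The encoding of vectors as Python tuples of 0/1 and the name mbva_cost are illustrative assumptions, not notation from the paper.

        def mbva_cost(assignment):
            """Cost of one candidate mBVA solution.

            `assignment` is a list of n m-tuples; each m-tuple holds one binary
            vector (a length-p tuple of 0/1) picked from each of the sets V^1..V^m.
            The cost is the total number of zeros over the n AND-vectors.
            """
            total_zeros = 0
            for m_tuple in assignment:
                p = len(m_tuple[0])
                for pos in range(p):
                    # bit-wise AND over the m vectors of the tuple at this position
                    if not all(vec[pos] == 1 for vec in m_tuple):
                        total_zeros += 1
            return total_zeros

        # toy instance with m = 2, n = 2, p = 3: pair one vector from V1 with one from V2
        V1 = [(1, 1, 0), (1, 0, 1)]
        V2 = [(1, 1, 1), (0, 0, 1)]
        print(mbva_cost([(V1[0], V2[0]), (V1[1], V2[1])]))  # AND-vectors 110 and 001 -> 3 zeros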

    Streaming Kernelization

    Kernelization is a formalization of preprocessing for combinatorially hard problems. We modify the standard definition of kernelization, which allows any polynomial-time algorithm for the preprocessing, by requiring instead that the preprocessing runs in a streaming setting and uses $\mathcal{O}(\mathrm{poly}(k)\log|x|)$ bits of memory on instances $(x,k)$. We obtain several results in this new setting, depending on the number of passes over the input that such a streaming kernelization is allowed to make. Edge Dominating Set turns out to be an interesting example because it has no single-pass kernelization, but two passes over the input suffice to match the bounds of the best standard kernelization.
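    To make the memory budget concrete, here is a minimal single-pass sketch in the spirit of this streaming setting: it greedily maintains a maximal matching and rejects as soon as the matching exceeds 2k edges, since every edge of an edge dominating set touches at most two edges of a matching. This is only an illustrative rejection test under an assumed edge-stream interface, not the paper's two-pass kernelization for Edge Dominating Set.

        def stream_eds_reject(edge_stream, k):
            """Single pass over the edge stream: return False if the graph certainly
            has no edge dominating set of size <= k, True if it might have one.

            At most 2k + 1 edges (a greedily grown maximal matching) are stored,
            i.e. O(k log|x|) bits, within the streaming-kernelization budget.
            """
            matched = set()   # endpoints of the matching kept so far
            matching = []     # the matching itself
            for u, v in edge_stream:
                if u not in matched and v not in matched:
                    matching.append((u, v))
                    matched.update((u, v))
                    if len(matching) > 2 * k:
                        # a matching larger than 2k rules out a solution of size k
                        return False
            return True

        edges = [(1, 2), (3, 4), (5, 6), (2, 3)]
        print(stream_eds_reject(iter(edges), k=1))  # the matching reaches 3 > 2 edges -> False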

    Efficient FPT algorithms for (strict) compatibility of unrooted phylogenetic trees

    In phylogenetics, a central problem is to infer the evolutionary relationships between a set of species $X$; these relationships are often depicted via a phylogenetic tree -- a tree having its leaves univocally labeled by elements of $X$ and without degree-2 nodes -- called the "species tree". One common approach for reconstructing a species tree consists in first constructing several phylogenetic trees from primary data (e.g. DNA sequences originating from some species in $X$), and then constructing a single phylogenetic tree maximizing the "concordance" with the input trees. The so-obtained tree is our estimation of the species tree and, when the input trees are defined on overlapping -- but not identical -- sets of labels, is called a "supertree". In this paper, we focus on two problems that are central when combining phylogenetic trees into a supertree: the compatibility and the strict compatibility problems for unrooted phylogenetic trees. These problems are strongly related, respectively, to the notions of "containing as a minor" and "containing as a topological minor" in the graph community. Both problems are known to be fixed-parameter tractable in the number of input trees $k$, by using their expressibility in Monadic Second Order Logic and a reduction to graphs of bounded treewidth. Motivated by the fact that the dependency on $k$ of these algorithms is prohibitively large, we give the first explicit dynamic programming algorithms for solving these problems, both running in time $2^{O(k^2)} \cdot n$, where $n$ is the total size of the input. Comment: 18 pages, 1 figure

    Upper and Lower Bounds for Weak Backdoor Set Detection

    We obtain upper and lower bounds for running times of exponential time algorithms for the detection of weak backdoor sets of 3CNF formulas, considering various base classes. These results include (omitting polynomial factors): (i) a $4.54^k$ algorithm to detect whether there is a weak backdoor set of at most $k$ variables into the class of Horn formulas; (ii) a $2.27^k$ algorithm to detect whether there is a weak backdoor set of at most $k$ variables into the class of Krom formulas. These bounds improve an earlier known bound of $6^k$. We also prove a $2^k$ lower bound for these problems, subject to the Strong Exponential Time Hypothesis. Comment: A short version will appear in the proceedings of the 16th International Conference on Theory and Applications of Satisfiability Testing
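    For intuition about what is being detected, the sketch below verifies a candidate weak backdoor set into the class of Horn formulas: it tries every assignment of the candidate variables and checks whether the reduced formula is Horn and satisfiable (via unit propagation). The DIMACS-style clause encoding and the function names are assumptions made for the illustration; this is the trivial verification step, not the improved detection algorithms of the paper.

        from itertools import product

        def is_horn(clauses):
            """A CNF formula is Horn if every clause has at most one positive literal.
            Clauses are frozensets of non-zero ints (DIMACS-style literals)."""
            return all(sum(1 for lit in clause if lit > 0) <= 1 for clause in clauses)

        def reduce_formula(clauses, assignment):
            """Apply a partial assignment {var: bool}: drop satisfied clauses and
            falsified literals. Returns None if an empty clause arises."""
            reduced = []
            for clause in clauses:
                if any((lit > 0) == assignment[abs(lit)]
                       for lit in clause if abs(lit) in assignment):
                    continue                     # clause satisfied
                rest = frozenset(lit for lit in clause if abs(lit) not in assignment)
                if not rest:
                    return None                  # clause falsified
                reduced.append(rest)
            return reduced

        def horn_sat(clauses):
            """Satisfiability of a Horn formula: propagate unit clauses; the
            remaining variables can always be set to False."""
            clauses = list(clauses)
            while True:
                unit = next((c for c in clauses if len(c) == 1), None)
                if unit is None:
                    return True
                lit = next(iter(unit))
                clauses = reduce_formula(clauses, {abs(lit): lit > 0})
                if clauses is None:
                    return False

        def is_weak_horn_backdoor(clauses, candidate_vars):
            """candidate_vars is a weak backdoor set into Horn iff some assignment of
            these variables leaves a satisfiable Horn formula."""
            for values in product((False, True), repeat=len(candidate_vars)):
                reduced = reduce_formula(clauses, dict(zip(candidate_vars, values)))
                if reduced is not None and is_horn(reduced) and horn_sat(reduced):
                    return True
            return False

        # (x1 v x2 v x3) and (~x1 v x2): setting x3 = True leaves a satisfiable Horn formula
        cnf = [frozenset({1, 2, 3}), frozenset({-1, 2})]
        print(is_weak_horn_backdoor(cnf, [3]))  # True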

    Slightly Superexponential Parameterized Problems

    A central problem in parameterized algorithms is to obtain algorithms with running time $f(k) \cdot n^{O(1)}$ such that $f$ is as slow-growing a function of the parameter $k$ as possible. In particular, a large number of basic parameterized problems admit parameterized algorithms where $f(k)$ is single-exponential, that is, $c^k$ for some constant $c$, which makes aiming for such a running time a natural goal for other problems as well. However, there are still plenty of problems where the $f(k)$ appearing in the best-known running time is worse than single-exponential, and it remained "slightly superexponential" even after serious attempts to bring it down. A natural question to ask is whether the $f(k)$ appearing in the running time of the best-known algorithms is optimal for any of these problems. In this paper, we examine parameterized problems where $f(k)$ is $k^{O(k)} = 2^{O(k \log k)}$ in the best-known running time, and for a number of such problems we show that the dependence on $k$ in the running time cannot be improved to single-exponential. More precisely, we prove the following tight lower bounds for four natural problems, arising from three different domains: (1) In the CLOSEST STRING problem, given strings $s_1, \dots, s_t$ over an alphabet $\Sigma$, each of length $L$, and an integer $d$, the question is whether there exists a string $s$ over $\Sigma$ of length $L$ such that its Hamming distance from each of the strings $s_i$, $1 \le i \le t$, is at most $d$. The pattern matching problem CLOSEST STRING is known to be solvable in time $2^{O(d \log d)} \cdot n^{O(1)}$ and in time $2^{O(d \log |\Sigma|)} \cdot n^{O(1)}$. We show that there are no $2^{o(d \log d)} \cdot n^{O(1)}$ or $2^{o(d \log |\Sigma|)} \cdot n^{O(1)}$ time algorithms, unless the Exponential Time Hypothesis (ETH) fails. (2) The graph embedding problem DISTORTION, that is, deciding whether a graph $G$ has a metric embedding into the integers with distortion at most $d$, can be solved in time $2^{O(d \log d)} \cdot n^{O(1)}$. We show that there is no $2^{o(d \log d)} \cdot n^{O(1)}$ time algorithm, unless the ETH fails. (3) The DISJOINT PATHS problem can be solved in time $2^{O(w \log w)} \cdot n^{O(1)}$ on graphs of treewidth at most $w$. We show that there is no $2^{o(w \log w)} \cdot n^{O(1)}$ time algorithm, unless the ETH fails. (4) The CHROMATIC NUMBER problem can be solved in time $2^{O(w \log w)} \cdot n^{O(1)}$ on graphs of treewidth at most $w$. We show that there is no $2^{o(w \log w)} \cdot n^{O(1)}$ time algorithm, unless the ETH fails. To obtain our results, we first prove the lower bound for variants of basic problems: finding cliques, independent sets, and hitting sets. These artificially constrained variants form a good starting point for proving lower bounds on natural problems without any technical restrictions and could be of independent interest. Several follow-up works have already obtained tight lower bounds by using our framework, and we believe it will prove useful in obtaining even more lower bounds in the future
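    To make the 2^{O(d log d)} regime for CLOSEST STRING concrete, the sketch below implements the classical search-tree strategy: start from s_1 and, while some input string is still at distance more than d, branch on d+1 mismatch positions. This is an illustrative implementation of that folklore branching idea under the notation above, not code from the paper.

        def closest_string(strings, d):
            """Return a string within Hamming distance d of every input string,
            or None if no such string exists. The search tree has at most
            (d+1)^d nodes, i.e. 2^{O(d log d)} overall (times polynomial factors)."""
            def mismatches(a, b):
                return [i for i, (x, y) in enumerate(zip(a, b)) if x != y]

            def search(candidate, budget):
                for s in strings:
                    bad = mismatches(candidate, s)
                    if len(bad) > d:
                        if budget == 0:
                            return None
                        # any valid solution agrees with s on at least one of any
                        # d+1 of these positions, so branching on them is sound
                        for pos in bad[:d + 1]:
                            child = candidate[:pos] + s[pos] + candidate[pos + 1:]
                            result = search(child, budget - 1)
                            if result is not None:
                                return result
                        return None
                return candidate  # within distance d of every input string

            return search(strings[0], d)

        print(closest_string(["acca", "agct", "acct"], d=1))  # prints "acct"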

    Relating the Time Complexity of Optimization Problems in Light of the Exponential-Time Hypothesis

    Obtaining lower bounds for NP-hard problems has for a long time been an active area of research. Recent algebraic techniques introduced by Jonsson et al. (SODA 2013) show that the time complexity of the parameterized SAT($\cdot$) problem correlates to the lattice of strong partial clones. With this ordering they isolated a relation $R$ such that SAT($R$) can be solved at least as fast as any other NP-hard SAT($\cdot$) problem. In this paper we extend this method and show that such languages also exist for the Max Ones problem (MaxOnes($\Gamma$)) and the Boolean valued constraint satisfaction problem over finite-valued constraint languages (VCSP($\Delta$)). With the help of these languages we relate MaxOnes and VCSP to the Exponential Time Hypothesis in several different ways. Comment: This is an extended version of "Relating the Time Complexity of Optimization Problems in Light of the Exponential-Time Hypothesis", appearing in Proceedings of the 39th International Symposium on Mathematical Foundations of Computer Science, MFCS 2014, Budapest, August 25-29, 2014

    Known Algorithms on Graphs of Bounded Treewidth Are Probably Optimal

    We obtain a number of lower bounds on the running time of algorithms solving problems on graphs of bounded treewidth. We prove the results under the Strong Exponential Time Hypothesis of Impagliazzo and Paturi. In particular, assuming that $n$-variable $m$-clause SAT cannot be solved in time $(2-\epsilon)^n m^{O(1)}$, we show that for any $\epsilon > 0$: INDEPENDENT SET cannot be solved in time $(2-\epsilon)^{tw(G)} |V(G)|^{O(1)}$; DOMINATING SET cannot be solved in time $(3-\epsilon)^{tw(G)} |V(G)|^{O(1)}$; MAX CUT cannot be solved in time $(2-\epsilon)^{tw(G)} |V(G)|^{O(1)}$; ODD CYCLE TRANSVERSAL cannot be solved in time $(3-\epsilon)^{tw(G)} |V(G)|^{O(1)}$; for any fixed $q \ge 3$, $q$-COLORING cannot be solved in time $(q-\epsilon)^{tw(G)} |V(G)|^{O(1)}$; PARTITION INTO TRIANGLES cannot be solved in time $(2-\epsilon)^{tw(G)} |V(G)|^{O(1)}$. Our lower bounds match the running times of the best known algorithms for these problems, up to the $\epsilon$ in the base
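    For contrast with these lower bounds, the sketch below spells out the matching kind of upper bound for INDEPENDENT SET: a textbook dynamic program over a path decomposition whose table has one entry per independent subset of the current bag, i.e. roughly 2^{tw(G)} states per bag. The input format (bags given as a list forming a valid path decomposition, adjacency as a dict of sets) is an assumption made for the illustration, and the transition is written naively rather than in the optimized form.

        from itertools import combinations

        def max_independent_set(adj, bags):
            """Maximum independent set size via DP over a path decomposition.

            adj:  dict mapping each vertex to the set of its neighbours.
            bags: list of bags (iterables of vertices) forming a valid path
                  decomposition: every vertex and edge occurs in some bag, and the
                  bags containing any fixed vertex are consecutive.
            """
            def independent(subset):
                return all(v not in adj[u] for u, v in combinations(subset, 2))

            def subsets(bag):
                bag = list(bag)
                for r in range(len(bag) + 1):
                    for comb in combinations(bag, r):
                        yield frozenset(comb)

            prev_bag = frozenset(bags[0])
            # table[S] = best solution size in the processed part of the graph
            #            among independent sets meeting the current bag exactly in S
            table = {S: len(S) for S in subsets(prev_bag) if independent(S)}
            for bag in bags[1:]:
                cur_bag = frozenset(bag)
                common = prev_bag & cur_bag
                new_table = {}
                for S in subsets(cur_bag):
                    if not independent(S):
                        continue
                    # compatible predecessor states agree with S on the shared vertices
                    best = max((val for T, val in table.items()
                                if T & common == S & common), default=None)
                    if best is not None:
                        new_table[S] = best + len(S - prev_bag)  # newly introduced vertices
                table, prev_bag = new_table, cur_bag
            return max(table.values())

        # path a-b-c-d with bags {a,b}, {b,c}, {c,d}; a maximum independent set has size 2
        adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
        print(max_independent_set(adj, [{"a", "b"}, {"b", "c"}, {"c", "d"}]))  # 2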

    Hitting forbidden subgraphs in graphs of bounded treewidth

    We study the complexity of a generic hitting problem H-Subgraph Hitting, where given a fixed pattern graph $H$ and an input graph $G$, the task is to find a set $X \subseteq V(G)$ of minimum size that hits all subgraphs of $G$ isomorphic to $H$. In the colorful variant of the problem, each vertex of $G$ is precolored with some color from $V(H)$ and we are required to hit only $H$-subgraphs with matching colors. Standard techniques show that for every fixed $H$, the problem is fixed-parameter tractable parameterized by the treewidth of $G$; however, it is not clear how exactly the running time should depend on treewidth. For the colorful variant, we demonstrate matching upper and lower bounds showing that the dependence of the running time on the treewidth of $G$ is tightly governed by $\mu(H)$, the maximum size of a minimal vertex separator in $H$. That is, we show for every fixed $H$ that, on a graph of treewidth $t$, the colorful problem can be solved in time $2^{\mathcal{O}(t^{\mu(H)})}\cdot|V(G)|$, but cannot be solved in time $2^{o(t^{\mu(H)})}\cdot |V(G)|^{O(1)}$, assuming the Exponential Time Hypothesis (ETH). Furthermore, we give some preliminary results showing that, in the absence of colors, the parameterized complexity landscape of H-Subgraph Hitting is much richer. Comment: A full version of a paper presented at MFCS 2014
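    As a small illustration of the problem definition itself, the sketch below checks by brute force whether a given set X hits all H-subgraphs, i.e. whether G - X contains no (not necessarily induced) copy of H. The encoding of H and G and the function name are assumptions; this naive check is exponential in |V(H)| and only meant for tiny examples, not the treewidth-based algorithms of the paper.

        from itertools import permutations

        def hits_all_h_subgraphs(g_adj, h_vertices, h_edges, hitting_set):
            """True iff `hitting_set` meets every subgraph of G isomorphic to H.

            g_adj:      dict vertex -> set of neighbours in G.
            h_vertices: list of vertices of H.
            h_edges:    list of edges of H over h_vertices.
            """
            remaining = [v for v in g_adj if v not in hitting_set]
            for image in permutations(remaining, len(h_vertices)):
                phi = dict(zip(h_vertices, image))
                # a (non-induced) copy only needs every edge of H to be present in G
                if all(phi[b] in g_adj[phi[a]] for a, b in h_edges):
                    return False  # found an H-copy avoiding the hitting set
            return True

        # G is a triangle 1-2-3 with a pendant vertex 4; H is a triangle
        g_adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
        h_vertices = ["x", "y", "z"]
        h_edges = [("x", "y"), ("y", "z"), ("x", "z")]
        print(hits_all_h_subgraphs(g_adj, h_vertices, h_edges, {1}))  # True
        print(hits_all_h_subgraphs(g_adj, h_vertices, h_edges, {4}))  # False: triangle 1-2-3 survives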

    Covering Problems for Partial Words and for Indeterminate Strings

    We consider the problem of computing a shortest solid cover of an indeterminate string. An indeterminate string may contain non-solid symbols, each of which specifies a subset of the alphabet that could be present at the corresponding position. We also consider covering partial words, which are a special case of indeterminate strings where each non-solid symbol is a don't care symbol. We prove that the indeterminate string covering problem and the partial word covering problem are NP-complete for a binary alphabet and show that both problems are fixed-parameter tractable with respect to $k$, the number of non-solid symbols. For the indeterminate string covering problem we obtain a $2^{O(k \log k)} + n k^{O(1)}$-time algorithm. For the partial word covering problem we obtain a $2^{O(\sqrt{k}\log k)} + n k^{O(1)}$-time algorithm. We prove that, unless the Exponential Time Hypothesis is false, no $2^{o(\sqrt{k})} n^{O(1)}$-time solution exists for either problem, which shows that our algorithm for partial words is close to optimal. We also present an algorithm for both problems which is feasible in practice. Comment: full version (simplified and corrected); preliminary version appeared at ISAAC 2014; 14 pages, 4 figures
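    For background, the classical covering notion for ordinary (solid) strings, which the paper generalizes to partial words and indeterminate strings, is easy to state and check: C covers T if every position of T lies inside some occurrence of C. The quadratic checker below only illustrates this baseline definition; it does not handle non-solid symbols.

        def is_cover(c, t):
            """Solid-string covering: c covers t if every position of t lies inside
            at least one occurrence of c in t."""
            m, n = len(c), len(t)
            covered = [False] * n
            for i in range(n - m + 1):
                if t[i:i + m] == c:
                    for j in range(i, i + m):
                        covered[j] = True
            return all(covered)

        print(is_cover("aba", "ababaaba"))  # True: occurrences at 0, 2 and 5 cover every position
        print(is_cover("ab", "ababaaba"))   # False: positions 4 and 7 remain uncovered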

    Bounded Search Tree Algorithms for Parameterized Cograph Deletion: Efficient Branching Rules by Exploiting Structures of Special Graph Classes

    Many fixed-parameter tractable algorithms using a bounded search tree have been repeatedly improved, often by describing a larger number of branching rules involving an increasingly complex case analysis. We introduce a novel and general search strategy that branches on the forbidden subgraphs of a graph class relaxation. By using the class of $P_4$-sparse graphs as the relaxed graph class, we obtain efficient bounded search tree algorithms for several parameterized deletion problems. We give the first non-trivial bounded search tree algorithms for the cograph edge-deletion problem and the trivially perfect edge-deletion problem. For the cograph vertex deletion problem, a refined analysis of the runtime of our simple bounded search tree algorithm gives a faster exponential factor than those algorithms designed with the help of complicated case distinctions and non-trivial running time analysis [21] and computer-aided branching rules [11]. Comment: 23 pages. Accepted in Discrete Mathematics, Algorithms and Applications (DMAA)
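    Branching on the forbidden induced subgraphs of the relaxed class first requires locating them; as a toy illustration of the simplest such obstruction, the sketch below finds an induced P_4 by brute force (cographs are exactly the P_4-free graphs). It is only a naive helper for small graphs, not the paper's branching procedure for the P_4-sparse relaxation.

        from itertools import permutations

        def find_induced_p4(adj):
            """Return an induced path (a, b, c, d) on four vertices if one exists,
            else None. A hit certifies that the graph is not a cograph."""
            def edge(u, v):
                return v in adj[u]

            for a, b, c, d in permutations(adj, 4):
                if (edge(a, b) and edge(b, c) and edge(c, d)
                        and not edge(a, c) and not edge(a, d) and not edge(b, d)):
                    return (a, b, c, d)
            return None

        # a path on four vertices is itself an induced P4
        print(find_induced_p4({1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}))  # (1, 2, 3, 4)
        # the complete graph K4 is a cograph, hence P4-free
        print(find_induced_p4({1: {2, 3, 4}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3}}))  # None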